Regularized Sparse Kernel Slow Feature Analysis
Authors
Abstract
This paper develops a kernelized slow feature analysis (SFA) algorithm. SFA is an unsupervised learning method for extracting features that encode the latent variables of a time series. The generative relationships are usually complex, and current algorithms are either not powerful enough or tend to over-fit. We combine the kernel trick with sparsification to provide a powerful function class for large data sets. Sparsity is achieved by a novel matching pursuit approach that can also be applied to other tasks. For small but complex data sets, however, the kernel SFA approach leads to over-fitting and numerical instabilities. To enforce a stable solution, we introduce regularization into the SFA objective. The versatility and performance of our method are demonstrated on audio and video data sets.
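The SFA objective underlying the paper can be illustrated with a minimal *linear* SFA sketch: whiten the signals, then find the unit-variance projections whose temporal derivative has the smallest variance. This is only an illustration of the objective, not the paper's kernelized, sparsified algorithm; the function name and interface are illustrative.

```python
import numpy as np

def linear_sfa(X, n_features=2):
    """Minimal linear SFA: extract unit-variance projections of X whose
    temporal differences have the smallest variance (the slowest features)."""
    Xc = X - X.mean(axis=0)                          # center the time series
    d, U = np.linalg.eigh(np.cov(Xc, rowvar=False))  # input covariance
    keep = d > 1e-10                                 # drop null directions
    W = U[:, keep] / np.sqrt(d[keep])                # whitening transform
    Z = Xc @ W                                       # whitened signals
    dZ = np.diff(Z, axis=0)                          # temporal differences
    dd, V = np.linalg.eigh(np.cov(dZ, rowvar=False))
    # eigh sorts eigenvalues ascending, so the leading columns of V
    # are the slowest directions; dd holds their "slowness" values.
    return Z @ V[:, :n_features], dd[:n_features]
```

Applied to a linear mixture of a slow and a fast sinusoid, the first extracted feature recovers the slow source up to sign; the kernel version in the paper replaces the linear expansion with a (sparsified, regularized) kernel expansion.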
Similar Resources
Regularized Sparse Kernel SFA with Decorrelation Filtering For Separating Correlated Sources
Digital image processing has advanced considerably in the past few years. Blind source separation is an important research area with numerous applications in signal processing, image processing, telecommunications and speech recognition. In this paper, blind source separation is performed using Slow Feature Analysis (SFA). It is necessary to use multivariate SFA instead of univariate ...
Full text
Sparse Random Feature Algorithm as Coordinate Descent in Hilbert Space
In this paper, we propose a Sparse Random Features algorithm, which learns a sparse non-linear predictor by minimizing an l1-regularized objective function over the Hilbert space induced by a kernel function. By interpreting the algorithm as randomized coordinate descent in an infinite-dimensional space, we show that the proposed approach converges to a solution within ε-precision of that using an...
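The combination described in this snippet, a random feature map plus an l1-regularized least-squares fit, can be sketched as follows. This is a generic illustration (random Fourier features fitted by ISTA, a standard proximal-gradient method), not the cited paper's coordinate-descent algorithm; all names and parameters are illustrative.

```python
import numpy as np

def random_fourier_features(X, n_feat, gamma, seed=0):
    """Map X to random Fourier features approximating a Gaussian kernel."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(X.shape[1], n_feat))
    b = rng.uniform(0.0, 2.0 * np.pi, n_feat)
    return np.sqrt(2.0 / n_feat) * np.cos(X @ W + b)

def lasso_ista(Phi, y, lam, n_iter=1000):
    """l1-regularized least squares via proximal gradient (ISTA);
    the soft-threshold step drives many coefficients exactly to zero."""
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        w = w - Phi.T @ (Phi @ w - y) / L    # gradient step on the squared loss
        w = np.sign(w) * np.maximum(np.abs(w) - lam / L, 0.0)  # soft threshold
    return w
```

Raising `lam` trades fit quality for sparsity of the learned predictor, which is the mechanism the snippet's l1-regularized objective relies on.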
Full text
Automatic Feature Selection via Weighted Kernels and Regularization
Selecting important features in non-linear kernel spaces is a difficult challenge in both classification and regression problems. We propose to achieve feature selection by optimizing a simple criterion: a feature-regularized loss function. Features within the kernel are weighted, and a lasso penalty is placed on these weights to encourage sparsity. We minimize this feature-regularized loss fun...
Full text
Regularized vector field learning with sparse approximation for mismatch removal
In vector field learning, regularized kernel methods such as regularized least squares require the number of basis functions to equal the training sample size N. The learning process thus has O(N³) time and O(N²) space complexity. This poses a significant burden on the vector field learning problem for large datasets. In this paper, we propose a sparse approximation t...
Full text
Parsimonious Support Vector Regression using Orthogonal Forward Selection with the Generalized Kernel Model
Sparse regression modeling is addressed using a generalized kernel model in which each kernel regressor has its own individually tuned position (center) vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to append regressors one by one. After the determination of the model structure, namely the selection of a certain number of regressors, the model weig...
Full text
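The greedy forward-selection idea in the last snippet (and the matching pursuit used in the main paper) can be sketched minimally: pick the candidate regressor most correlated with the current residual, refit by least squares, and repeat. This is a generic orthogonal-matching-pursuit sketch under illustrative names, not either paper's exact procedure.

```python
import numpy as np

def omp_forward_selection(Phi, y, n_select):
    """Greedy orthogonal forward selection: repeatedly pick the candidate
    column most correlated with the residual, then refit by least squares."""
    selected = []
    residual = y.astype(float).copy()
    w = np.zeros(0)
    for _ in range(n_select):
        corr = np.abs(Phi.T @ residual)
        corr[selected] = -np.inf                 # never re-pick a chosen column
        selected.append(int(np.argmax(corr)))
        S = Phi[:, selected]                     # current dictionary subset
        w, *_ = np.linalg.lstsq(S, y, rcond=None)
        residual = y - S @ w                     # residual after refitting
    return selected, w
```

Refitting all selected coefficients at every step (rather than only the newest one) is what distinguishes *orthogonal* matching pursuit from plain matching pursuit and keeps the residual orthogonal to the selected columns.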